9 research outputs found

    Fly Photoreceptors Demonstrate Energy-Information Trade-Offs in Neural Coding

    Trade-offs between energy consumption and neuronal performance must shape the design and evolution of nervous systems, but we lack empirical data showing how neuronal energy costs vary according to performance. Using intracellular recordings from the intact retinas of four flies, Drosophila melanogaster, D. virilis, Calliphora vicina, and Sarcophaga carnaria, we measured the rates at which homologous R1–6 photoreceptors of these species transmit information from the same stimuli and estimated the energy they consumed. In all species, both information rate and energy consumption increase with light intensity. Energy consumption rises from a baseline, the energy required to maintain the dark resting potential. This substantial fixed cost, ∼20% of a photoreceptor's maximum consumption, causes the unit cost of information (ATP molecules hydrolysed per bit) to fall as information rate increases. The highest information rates, achieved at bright daylight levels, differed according to species, from ∼200 bits s⁻¹ in D. melanogaster to ∼1,000 bits s⁻¹ in S. carnaria. Comparing species, the fixed cost, the total cost of signalling, and the unit cost (cost per bit) all increase with a photoreceptor's highest information rate to make information more expensive in higher-performance cells. This law of diminishing returns promotes the evolution of economical structures by severely penalising overcapacity. Similar relationships could influence the function and design of many neurons because they are subject to similar biophysical constraints on information throughput.
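
    The within-cell effect described above (a fixed dark cost driving down the cost per bit as the information rate rises) can be illustrated with a short calculation. The sketch below uses a simple linear cost model with made-up numbers (1e9 ATP/s maximum consumption, a 20% fixed fraction, a 1,000 bits s⁻¹ peak rate); these are illustrative assumptions, not the paper's measurements.

        # Illustrative sketch of the law of diminishing returns: a fixed baseline
        # cost makes the unit cost of information (ATP per bit) fall as the
        # information rate rises. All numbers below are assumptions, not data.
        def atp_per_bit(info_rate_bits_per_s,
                        max_consumption_atp_per_s=1e9,    # assumed maximum ATP use
                        fixed_fraction=0.20,              # dark resting cost, ~20% of max
                        max_info_rate_bits_per_s=1000.0): # assumed peak information rate
            """Unit cost of information under a linear cost-vs-rate model."""
            fixed_cost = fixed_fraction * max_consumption_atp_per_s
            variable_cost = (1.0 - fixed_fraction) * max_consumption_atp_per_s
            total = fixed_cost + variable_cost * (info_rate_bits_per_s / max_info_rate_bits_per_s)
            return total / info_rate_bits_per_s

        for rate in (50, 200, 500, 1000):
            print(f"{rate:5d} bits/s -> {atp_per_bit(rate):.2e} ATP per bit")

    Because the fixed term dominates at low rates, the cost per bit falls steeply as the cell approaches its maximum rate, which is the within-cell trend reported in the abstract.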

    Visual Coding in Locust Photoreceptors

    Information capture by photoreceptors ultimately limits the quality of visual processing in the brain. Using conventional sharp microelectrodes, we studied how locust photoreceptors encode random (white-noise, WN) and naturalistic (1/f stimuli, NS) light patterns in vivo and how this coding changes with mean illumination and ambient temperature. We also examined the role of their plasma membrane in shaping voltage responses. We found that brightening or warming increases and accelerates voltage responses but reduces noise, enabling photoreceptors to encode more information. For WN stimuli, this was accompanied by a broadening of the linear frequency range. In contrast, with NS the signaling took place within a constant bandwidth, possibly revealing a ‘preference’ for inputs with 1/f statistics. The faster signaling was caused by acceleration of the elementary phototransduction currents, which produce the bumps, and of their latency distribution. The membrane linearly translated phototransduction currents into voltage responses without limiting the throughput of these messages. As the bumps reflected fast changes in membrane resistance, the data suggest that their shape is predominantly driven by fast changes in the light-gated conductance. On the other hand, the slower bump latency distribution is likely to represent slower enzymatic intracellular reactions. Furthermore, the Q10 values of bump duration and latency distribution depended on light intensity. Altogether, this study suggests that the biochemical constraints imposed upon signaling change continuously as locust photoreceptors adapt to environmental light and temperature conditions.
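
    For readers unfamiliar with how such information rates are typically obtained, the sketch below shows the standard spectral (Shannon) estimate from repeated responses to the same stimulus, together with the textbook Q10 definition. It is a minimal, assumed version of this kind of analysis, not the authors' exact pipeline; the function names are our own.

        # Minimal sketch (assumed method): information rate from repeated voltage
        # responses via R = sum over frequencies of log2(1 + SNR(f)) * df, which is
        # valid under Gaussian signal/noise assumptions, plus the standard Q10.
        import numpy as np

        def info_rate(responses, dt):
            """responses: array of shape (trials, samples), voltage responses to one
            repeated stimulus; dt: sample interval in seconds. Returns bits/s."""
            mean_resp = responses.mean(axis=0)        # signal estimate (trial average)
            noise = responses - mean_resp             # trial-to-trial deviations
            signal_psd = np.abs(np.fft.rfft(mean_resp)) ** 2
            noise_psd = np.mean(np.abs(np.fft.rfft(noise, axis=1)) ** 2, axis=0)
            snr = signal_psd / np.maximum(noise_psd, 1e-12)
            df = 1.0 / (responses.shape[1] * dt)      # width of one frequency bin (Hz)
            return np.sum(np.log2(1.0 + snr)) * df

        def q10(value_cold, value_warm, temp_cold_c, temp_warm_c):
            """Temperature coefficient Q10 of a rate-like quantity measured at two
            temperatures (°C); for bump duration, apply it to 1/duration."""
            return (value_warm / value_cold) ** (10.0 / (temp_warm_c - temp_cold_c))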

    Network Adaptation Improves Temporal Representation of Naturalistic Stimuli in Drosophila Eye: II Mechanisms

    Retinal networks must adapt constantly to best present the ever-changing visual world to the brain. Here we test the hypothesis that adaptation is the result of different mechanisms at several synaptic connections within the network. In a companion paper (Part I), we showed that adaptation in the photoreceptors (R1-R6) and large monopolar cells (LMCs) of the Drosophila eye improves sensitivity to under-represented signals within seconds by enhancing both the amplitude and the frequency distribution of LMCs' voltage responses to repeated naturalistic contrast series. In this paper, we show that such adaptation requires both the light-mediated conductance and the feedback-mediated synaptic conductance. A faulty feedforward pathway in histamine-receptor mutant flies speeds up the LMC output, mimicking extreme light adaptation. A faulty feedback pathway from L2 LMCs to photoreceptors slows down the LMC output, mimicking dark adaptation. These results underline the importance of network adaptation for efficient coding and its role as a mechanism for selectively regulating the size and speed of signals in neurons. We suggest that the concerted action of many different mechanisms and neural connections is responsible for adaptation to visual stimuli. Further, our results demonstrate the need for detailed circuit reconstructions, such as that of the Drosophila lamina, to understand how networks process information.

    Power-Law Inter-Spike Interval Distributions Infer a Conditional Maximization of Entropy in Cortical Neurons

    The brain is considered to use a relatively small amount of energy for its efficient information processing. Under a severe restriction on energy consumption, the maximization of mutual information (MMI), which is adequate for designing artificial processing machines, may not suit the brain. The MMI attempts to send information as accurately as possible, and this usually requires a sufficient energy supply for establishing clearly discretized communication bands. Here, we derive an alternative hypothesis for the neural code from neuronal activities recorded juxtacellularly in the sensorimotor cortex of behaving rats. Our hypothesis states that in vivo cortical neurons maximize the entropy of neuronal firing under two constraints, one limiting the energy consumption (as assumed previously) and one restricting the uncertainty in output spike sequences at a given firing rate. Thus, the conditional maximization of firing-rate entropy (CMFE) solves a trade-off between energy cost and noise in the neuronal response. In short, the CMFE sends a rich variety of information through broader communication bands (i.e., widely distributed firing rates) at the cost of accuracy. We demonstrate that the CMFE is reflected in the long-tailed, typically power-law, distributions of inter-spike intervals obtained for the majority of recorded neurons. In other words, the power-law tails are more consistent with the CMFE than with the MMI. Thus, we propose a mathematical principle by which cortical neurons may represent information about synaptic input in their output spike trains.
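
    Written as an optimization problem, the hypothesis above amounts to a constrained entropy maximization. The schematic below uses our own notation (p(r) for the firing-rate distribution, E(r) for the energy cost of firing at rate r), not the paper's symbols.

        % Schematic of the CMFE hypothesis; notation is ours, not the paper's.
        \begin{aligned}
        \max_{p(r)} \quad & H[r] = -\int p(r)\,\log p(r)\,dr
          && \text{(entropy of firing rates)} \\
        \text{subject to} \quad & \int p(r)\,E(r)\,dr \le E_{\max}
          && \text{(limited energy consumption)} \\
        & H[\text{spike sequence} \mid r] \le H_{\text{noise}}
          && \text{(bounded uncertainty of output spikes at a given rate)}
        \end{aligned}

    Under the energy constraint alone, the maximum-entropy solution is the familiar exponential rate distribution; the abstract argues that adding the second constraint broadens the distribution of firing rates, which appears as the power-law tails of the inter-spike interval distributions.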

    Noise in the nervous system
